
# Hallucination Detection

## Phi3 Hallucination Judge Merge
License: MIT · Publisher: grounded-ai · Tags: Large Language Model, Transformers
Detects hallucinations in language model outputs, i.e., responses that are coherent but factually incorrect or unsupported by the given context.
Downloads: 63 · Likes: 1
## TD HallOumi 3B
Publisher: TEEN-D · Tags: Text Classification, English
A claim verification model fine-tuned from Llama-3.2-3B-Instruct, specifically designed to detect hallucinations or unsupported statements in AI-generated text.
Downloads: 46 · Likes: 2
## Xlm Roberta Mushroom Qa
Publisher: MichielPronk · Tags: Large Language Model, Transformers
Fine-tuned for the SemEval 2025 Task 3 (Mu-SHROOM) shared task on identifying hallucinated spans in large language model outputs.
Downloads: 71 · Likes: 2
## Hallucination Evaluation Model
License: Apache-2.0 · Publisher: vectara · Tags: Large Language Model, Transformers, English
HHEM-2.1-Open is a hallucination detection model developed by Vectara, designed to evaluate the consistency between content generated by large language models and the given source evidence.
Downloads: 229.46k · Likes: 280
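
To illustrate how an evidence-consistency detector like HHEM-2.1-Open can be called, the sketch below scores evidence/answer pairs with the `transformers` library. It assumes the model is available on Hugging Face under the repo id `vectara/hallucination_evaluation_model` and that the repository's custom code exposes a `predict()` helper; treat the repo id and that helper as assumptions based on the hub listing rather than an API guaranteed by this page.

```python
# Minimal sketch: scoring consistency between source evidence and a generated
# answer with HHEM-2.1-Open. The repo id and the predict() helper are
# assumptions taken from the Hugging Face listing for this model.
from transformers import AutoModelForSequenceClassification

pairs = [
    # (source evidence / premise, generated answer / hypothesis)
    ("The capital of France is Paris.", "Paris is the capital of France."),
    ("The capital of France is Paris.", "The capital of France is Marseille."),
]

model = AutoModelForSequenceClassification.from_pretrained(
    "vectara/hallucination_evaluation_model",  # assumed repo id
    trust_remote_code=True,                    # loads Vectara's custom scoring head
)

# Each score is in [0, 1]; higher means the answer is more consistent with
# (grounded in) the evidence, while a low score suggests hallucination.
scores = model.predict(pairs)
print(scores)
```

A threshold on these scores (for example, flagging anything below 0.5) is one simple way to turn the consistency score into a hallucinated/grounded decision; the cutoff is application-specific.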